# Korean Optimization

## Kakaocorp.kanana 1.5 8b Instruct 2505 GGUF

Kanana-1.5-8B-Instruct-2505 is an 8B-parameter instruction-tuned language model developed by Kakao Corp, suitable for text generation tasks.

Tags: Large Language Model · Author: DevQuasar

## Trillion 7B Preview AWQ

The Trillion-7B Preview is a multilingual large language model supporting English, Korean, Japanese, and Chinese. It outperforms other 7B-scale models in computational efficiency and performance.

License: Apache-2.0 · Tags: Large Language Model, Supports Multiple Languages · Author: trillionlabs

## Trillion 7B Preview

The Trillion-7B Preview is a multilingual large language model supporting English, Korean, Japanese, and Chinese. It achieves performance competitive with higher-computation models while maintaining lower computational requirements.

License: Apache-2.0 · Tags: Large Language Model, Transformers, Supports Multiple Languages · Author: trillionlabs

## K Finance Sentence Transformer

This is a sentence-transformers-based sentence embedding model that maps text into a 768-dimensional vector space, suitable for semantic search and clustering tasks.

Tags: Text Embedding, Transformers · Author: ohsuz

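The embedding workflow these sentence-transformer models enable can be sketched without loading the model itself: assuming the model returns one fixed-size vector per sentence (768 dimensions for this model), semantic search reduces to ranking a corpus by cosine similarity against the query vector. A minimal pure-Python sketch, with small toy vectors standing in for real embeddings:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity: dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def semantic_search(query_vec, corpus_vecs, top_k=1):
    # Rank corpus entries by similarity to the query embedding,
    # highest similarity first.
    scored = [(i, cosine_similarity(query_vec, v))
              for i, v in enumerate(corpus_vecs)]
    return sorted(scored, key=lambda t: t[1], reverse=True)[:top_k]

# Toy 4-dim vectors standing in for 768-dim sentence embeddings.
corpus = [
    [0.9, 0.1, 0.0, 0.0],   # e.g. a sentence about interest rates
    [0.0, 0.8, 0.2, 0.0],   # e.g. a sentence about dividends
    [0.1, 0.0, 0.0, 0.9],   # e.g. an unrelated sentence
]
query = [0.8, 0.2, 0.0, 0.1]  # closest in direction to the first document
print(semantic_search(query, corpus))
```

In practice the vectors would come from the model's encode call rather than being written by hand; the ranking logic stays the same.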
## Synatra 7B V0.3 RP

Synatra-7B-v0.3-RP is a Korean large language model fine-tuned from Mistral-7B-Instruct-v0.1, specializing in role-play and dialogue generation.

Tags: Large Language Model, Transformers, Korean · Author: maywell

## Sentence Transformer Klue

This is a sentence-transformers-based model that maps sentences and paragraphs into a 768-dimensional dense vector space, suitable for tasks such as clustering and semantic search.

Tags: Text Embedding, Transformers · Author: hunkim

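The clustering use case mentioned for these embedding models can also be sketched without the model: given one vector per sentence, a simple greedy scheme groups vectors whose cosine similarity to a cluster's first member exceeds a threshold. This is an illustrative toy (real pipelines typically use k-means or community detection over the model's actual embeddings):

```python
import math

def cosine(a, b):
    # Cosine similarity between two equal-length vectors.
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) *
                  math.sqrt(sum(x * x for x in b)))

def threshold_cluster(vectors, threshold=0.8):
    # Greedy clustering: assign each vector to the first cluster whose
    # representative (its first member) is similar enough; otherwise
    # start a new cluster.
    clusters = []  # each cluster is a list of indices into `vectors`
    for i, v in enumerate(vectors):
        for cluster in clusters:
            if cosine(vectors[cluster[0]], v) >= threshold:
                cluster.append(i)
                break
        else:
            clusters.append([i])
    return clusters

# Toy 3-dim vectors standing in for 768-dim sentence embeddings.
vecs = [
    [1.0, 0.0, 0.0],
    [0.95, 0.05, 0.0],  # points almost the same way as the first vector
    [0.0, 1.0, 0.0],    # orthogonal: a different topic
]
print(threshold_cluster(vecs))
```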
## KE-T5 Base

KE-T5 is a text-to-text model based on the T5 architecture, developed by the Korea Electronics Technology Institute, supporting a variety of NLP tasks.

License: Apache-2.0 · Tags: Large Language Model, Supports Multiple Languages · Author: KETI-AIR

© 2025 AIbase